Movement and Gesture in Intelligent Interactive Music Systems
Authors
Abstract
This paper introduces a research area on music and interaction by discussing issues in the integration of movement and music languages. Concerning movement, special focus is given to high-level, qualitative analysis of human movement, building on previous studies of movement from this viewpoint, such as Laban’s Theory of Effort (Laban and Lawrence 1947), and on KANSEI Information Processing (Camurri 1997; Hashimoto 1997). Typical application scenarios include interactive music, theatre, art and museum installations, possibly including robots on stage. Other main research issues include real-time analysis and synthesis of expressivity in music and in movement signals (e.g., dance), multimodal interaction, and intelligent interfaces in which sound, music, visual and gesture languages, and mobile (robotic) scenographies are the components of an integrated, multi-level communication. We present this research area and some recent developments by introducing our EyesWeb project and some music projects in which these issues have been faced. The goals of the EyesWeb project are twofold. On the one hand, it aims at exploring and developing models of interaction by extending music language toward gesture and visual languages, with a particular focus on analysing expressive content in gesture and movement and on generating expressive outputs. For example, in EyesWeb we aim at developing methods able to distinguish the different expressive content of two instances of the same movement pattern, e.g., two performances of the same dance fragment. On the other hand, EyesWeb includes the development of an open visual software environment for the real-time analysis of full-body movement and gesture of one or more humans, and its use in the scenarios described above. Our research is also inspired by the well-known languages for choreography, Laban’s Theory of Effort (Laban and Lawrence 1947) and Eshkol-Wachman movement notation, and by research in KANSEI Information Processing (Hashimoto 1997).
We are successfully employing our research in EyesWeb in various music, interactive art, and museum applications, briefly surveyed in the final part of the paper.
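As an illustration of the kind of low-level cue such expressive analysis can start from, the following sketch estimates a per-frame "quantity of motion" as the fraction of pixels that change noticeably between consecutive grayscale frames. This is a minimal, hypothetical example in the spirit of EyesWeb-style motion cues, not the actual EyesWeb implementation; the function name and threshold are assumptions.

```python
import numpy as np

def quantity_of_motion(frames, threshold=30):
    """Estimate per-frame motion energy from a sequence of grayscale frames.

    For each pair of consecutive frames, pixels whose absolute difference
    exceeds `threshold` count as "moving"; the fraction of moving pixels
    gives a rough, normalized quantity-of-motion value in [0, 1].
    """
    qom = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Cast to a signed type so the subtraction cannot wrap around.
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        moving = int((diff > threshold).sum())
        qom.append(moving / diff.size)
    return qom

# Two synthetic 4x4 "frames": a bright 2x2 blob shifts one pixel right.
f0 = np.zeros((4, 4), dtype=np.uint8); f0[1:3, 0:2] = 255
f1 = np.zeros((4, 4), dtype=np.uint8); f1[1:3, 1:3] = 255
print(quantity_of_motion([f0, f1]))  # → [0.25]
```

Curves of such values over time, rather than the raw values themselves, are what would let a system compare two performances of the same dance fragment.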
Similar papers
Interactive Systems Design: A KANSEI based Approach
1 This research is partially funded by the EU IST Projects CARE HERE (Creating Aesthetically Resonant Environments for the Handicapped, Elderly and Rehabilitation) no. IST-2001-32729, MEGA (Multisensory Expressive Gesture Applications) no. IST-1999-20410 (www.megaproject.org), and by National Projects COFIN2000 and CNR project CNRG0024AF “Metodi di analisi dell’espressività nel movimento umano p...
Music and Gesture: Sensor Technologies in Interactive Music and the Theremin based Space Control Systems
This work gives a brief overview of some of the most important, but partially forgotten, biomechanical concepts based on techniques for measuring the body movement of workers, athletes and musicians, developed in Russia in the 1920s. These experimental works and their results are very useful now for studying musical performance from the point of view of the development of Interactive Music Syst...
Music via Motion: A distributed framework for interactive multimedia performance
Music performance is closely associated with body movement at many levels, for both the instrument player and the listener. Recent studies show the existence of a strong link between bodily movement and the conception and perception of music. The MvM (Music via Motion) framework was proposed over ten years ago and has been designed for several different scenarios, including interactive dance perf...
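A framework like MvM ultimately maps motion features to musical parameters. A minimal sketch of one such mapping, assuming a normalized motion-activity value and a simple linear mapping onto a MIDI pitch range (the function name and range are illustrative, not MvM's actual design):

```python
def motion_to_note(activity, low=48, high=84):
    """Map a motion-activity value in [0, 1] to a MIDI note number.

    Values outside [0, 1] are clamped; `low` and `high` bound the
    resulting pitch range (here C3..C6).
    """
    activity = min(1.0, max(0.0, activity))
    return round(low + activity * (high - low))

print(motion_to_note(0.0))  # → 48
print(motion_to_note(0.5))  # → 66
```

Real systems typically smooth the activity signal over time before mapping, so that noise in the motion estimate does not produce jittery musical output.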
Embodied music listening and making in context-aware mobile applications: the EU-ICT SAME Project
Music making and listening are a clear example of a human activity that is above all interactive and social, two big challenges for novel HCI paradigms. Nowadays, however, listening to music is usually still a passive, non-interactive experience. Quoting John Sloboda: “In highly industrialized societies, we listen to more music, but we make less” [1]. Even modern devices do not allow for inter...
Generative Improv. & Interactive Music Project (GIIMP)
GIIMP addresses the criticism that in many interactive music systems the machine simply reacts. Interaction is addressed by extending Winkler’s [18] model toward adapting Paine’s [10] conversational model of interaction. Realized using commercial tools, GIIMP implements a machine/human generative improvisation system using human gesture input, machine gesture capture, and a gesture mutation mod...
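GIIMP's gesture mutation module is not specified in the snippet above; as a hedged illustration of the general idea, a captured gesture represented as a list of normalized parameter values could be varied by small random perturbations before being played back, so the machine responds with a related rather than identical gesture. All names and parameters here are assumptions, not GIIMP's implementation.

```python
import random

def mutate_gesture(gesture, rate=0.2, scale=0.1, rng=None):
    """Return a perturbed copy of a gesture, represented as a list of
    parameter values normalized to [0, 1].

    Each value is nudged by a uniform offset in [-scale, scale] with
    probability `rate`, then clamped back into [0, 1].
    """
    rng = rng or random.Random()
    mutated = []
    for v in gesture:
        if rng.random() < rate:
            v = min(1.0, max(0.0, v + rng.uniform(-scale, scale)))
        mutated.append(v)
    return mutated

original = [0.2, 0.5, 0.8, 0.5]
variant = mutate_gesture(original, rate=1.0, rng=random.Random(42))
print(variant)  # a nearby variant of the original gesture
```

Passing a seeded `random.Random` makes the mutation reproducible, which is useful when the same machine response must be replayed in rehearsal.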